
[EM] Compress dense ellpack. #10821

Merged — 13 commits merged into dmlc:master from ext-ellpack-dense-1 on Sep 20, 2024
Conversation

trivialfis (Member)

@trivialfis trivialfis commented Sep 15, 2024

This helps reduce the memory copying needed for dense data. In addition, it helps reduce memory usage even if external memory is not used.

  • Decouple the number of symbols needed by the compressor from the number of features when the data is dense.
  • Remove the fetch call in the at_end_ iteration.
  • Reduce synchronization and kernel launches by using the uvector and ctx.
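To illustrate the first bullet, here is a minimal back-of-the-envelope sketch (not XGBoost's actual implementation) of why decoupling the symbol count from the number of features shrinks a bit-packed dense page. When every cell is present, bin indices can be stored feature-locally, so the symbol alphabet is bounded by the largest per-feature bin count rather than the global bin count summed over all features. The function names and the reserved "missing" symbol below are assumptions for illustration only.

```python
import math

def bits_per_symbol(n_symbols: int) -> int:
    # Bits needed to bit-pack one symbol from an alphabet of n_symbols.
    return max(1, math.ceil(math.log2(n_symbols)))

def ellpack_bits(n_rows: int, n_features: int, bins_per_feature: list[int],
                 dense: bool) -> int:
    # Hypothetical sizing helper: total bits for an ellpack-style page
    # storing one symbol per (row, feature) cell.
    if dense:
        # Dense: feature-local bin indices; alphabet size is the max bins of
        # any single feature (+1 reserved symbol), independent of n_features.
        n_symbols = max(bins_per_feature) + 1
    else:
        # General case: globally offset bin indices; alphabet size grows with
        # the total number of bins across all features (+1 for missing).
        n_symbols = sum(bins_per_feature) + 1
    return n_rows * n_features * bits_per_symbol(n_symbols)

# Assumed example shape: 1000 rows, 100 features, 256 bins per feature.
bins = [256] * 100
dense_bits = ellpack_bits(1000, 100, bins, dense=True)    # 9 bits/cell
global_bits = ellpack_bits(1000, 100, bins, dense=False)  # 15 bits/cell
```

With these assumed shapes the dense encoding needs 9 bits per cell (alphabet of 257) versus 15 bits per cell for the globally indexed encoding (alphabet of 25601), a sizeable reduction that grows with the number of features.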

todos:

  • Add tests.
  • Perf check.

After this PR, we will:

  • Extend the compression to relatively dense data. We might reuse the parameter sparse_threshold from the CPU hist.
  • Tune the quantile sketching memory estimation to reduce memory usage.

@trivialfis trivialfis force-pushed the ext-ellpack-dense-1 branch 2 times, most recently from f9c153a to e8b2eca Compare September 17, 2024 17:04
This helps reduce the memory copying needed for dense data.

- Compress dense ellpack pages by making the number of symbols independent of the number of features.
- Avoid fetching data in the data source during `at_end_`.
- Clean up and optimize memory operations.
@trivialfis trivialfis changed the title [WIP][EM] Compress dense ellpack. [EM] Compress dense ellpack. Sep 18, 2024
@trivialfis trivialfis marked this pull request as ready for review September 18, 2024 20:59
@trivialfis (Member, Author)

cc @rongou.

@trivialfis trivialfis merged commit 24241ed into dmlc:master Sep 20, 2024
30 checks passed
@trivialfis trivialfis deleted the ext-ellpack-dense-1 branch September 20, 2024 10:21